Using Layer Recurrent Neural Network to Generate Pseudo Random Number Sequences

Authors

  • Veena Desai
  • Ravindra Patil
  • Dandina Rao
Abstract

Pseudo Random Numbers (PRNs) are required for many cryptographic applications. This paper proposes a new method for generating PRNs using a Layer Recurrent Neural Network (LRNN). The proposed technique generates PRNs from the weight matrix obtained from the layer weights of the LRNN. The LRNN random number generator (RNG) uses a short keyword as a seed and generates a long pseudo-random sequence. The number of bits generated in the PRN sequence depends on the number of neurons in the input layer of the LRNN. The generated PRN sequence changes with a change in the training function of the LRNN. The sequences generated are a function of the keyword, the initial state of the network, and the training function. In our implementation the PRN sequences have been generated using three training functions: 1) Scaled Gradient Descent, 2) Levenberg-Marquardt (TRAINLM), and 3) TRAINBFG. The generated sequences are tested for randomness using the ENT and NIST test suites. The ENT test can be applied to sequences of small size. NIST has 16 tests for random numbers. When subjected to NIST, the LRNN-generated PRNs pass 11 tests, show no observations for 4 tests, and fail 1 test. This paper presents the test results for random number sequences ranging from 25 bits to 1000 bits, generated using the LRNN.
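The idea in the abstract — seed a small recurrent network from a keyword, perturb its layer weights through training, then threshold the resulting weight matrix into a bit string — can be illustrated with a minimal Python sketch. This is an assumption-laden stand-in, not the authors' MATLAB implementation: the mock Hebbian-style weight updates below substitute for the paper's actual training functions (Scaled Gradient Descent, TRAINLM, TRAINBFG), and the function name and parameters are hypothetical.

```python
import hashlib
import numpy as np

def lrnn_prn_bits(keyword: str, n_bits: int = 100, hidden: int = 8) -> str:
    """Sketch: derive a deterministic pseudo-random bit string from
    the weight matrix of a small recurrent layer seeded by a keyword.
    The weight-update loop is a placeholder for real LRNN training."""
    # Derive a deterministic RNG seed from the keyword (hypothetical
    # choice; the paper does not specify this mapping).
    seed = int.from_bytes(hashlib.sha256(keyword.encode()).digest()[:4], "big")
    rng = np.random.default_rng(seed)

    # Layer weights: input-to-hidden vector and recurrent hidden matrix.
    w_in = rng.normal(size=hidden)
    w_rec = rng.normal(size=(hidden, hidden))

    # A few mock "training" steps perturb the recurrent weights; this
    # stands in for the gradient-based training functions in the paper.
    state = np.zeros(hidden)
    for x in rng.normal(size=50):
        state = np.tanh(w_in * x + w_rec @ state)
        w_rec += 0.01 * np.outer(state, state)  # Hebbian-style nudge

    # Threshold the final weight-matrix entries around their median to
    # turn the matrix into a bit sequence, then trim/tile to n_bits.
    flat = w_rec.ravel()
    bits = np.resize((flat > np.median(flat)).astype(int), n_bits)
    return "".join(map(str, bits))
```

The same keyword always yields the same sequence, matching the abstract's claim that the output is a function of the seed keyword and initial network state; changing the training procedure (here, the update rule) changes the sequence.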

Related resources

Can Dynamic Neural Filters Produce Pseudo-Random Sequences?

Dynamic neural filters (DNFs) are recurrent networks of binary neurons. Under proper conditions on their synaptic matrix they are known to generate exponentially large cycles. We show that when the synaptic matrix is chosen to be a random orthogonal one, the average cycle length becomes close to that of a random map. We then proceed to investigate the inversion problem and argue that such a DNF could...


An efficient one-layer recurrent neural network for solving a class of nonsmooth optimization problems

Constrained optimization problems have a wide range of applications in science, economics, and engineering. In this paper, a neural network model is proposed to solve a class of nonsmooth constrained optimization problems with a nonsmooth convex objective function subject to nonlinear inequality and affine equality constraints. It is a one-layer non-penalty recurrent neural network based on the...


Pulse Waveform Synthesis Using Recurrent Complex Valued Neural Networks

Abstract: An experiment in time-sequential pulse train synthesis using a layered and partially recurrent complex-valued neural network is reported. Half of the three-layer complex-valued neural network is used to generate sinusoidal oscillation, and the other half to adaptively synthesize the intended pulse shapes and sequences. Stable time-sequential pulse signals are obtained after completion of l...


A Recurrent Neural Network to Identify Efficient Decision Making Units in Data Envelopment Analysis

In this paper we present a recurrent neural network model to recognize efficient Decision Making Units (DMUs) in Data Envelopment Analysis (DEA). The proposed neural network model is derived from an unconstrained minimization problem. In the theoretical aspect, it is shown that the proposed neural network is stable in the sense of Lyapunov and globally convergent. The proposed model has a single-laye...


Bounds on Sparsity of One-Hidden-Layer Perceptron Networks

Limitations of one-hidden-layer (shallow) perceptron networks in sparsely representing multivariable functions are investigated. A concrete class of functions is described whose computation by shallow perceptron networks requires either a large number of units or is unstable due to large output weights. The class is constructed using pseudo-noise sequences, which have many features of random sequences...



Publication date: 2012